Let's start. Hello everybody, can everybody hear me? I don't know if this microphone is working or not. Okay, let's see. The lecture will be recorded, so I hope this
microphone works. Okay, good. So hello everybody, I'm Vincent Christlein; I'm also on that slide here, in the long list of slide authors. I'm standing in for Andreas Maier, who is currently on a conference trip, and he probably won't be here next week either because of a health issue. Okay, so we are actually quite a big team. Some of them are also
here. Maybe you can just come quickly in front. So some of the tutors are here. Katharina
Breininger is, I think, also on a conference trip. Suleyman, I can't see him. Okay. So the bottom row is basically from our lab, and the top row, apart from Andreas Maier of course, are student tutors who will help you in the tutorials. Noah and Florian are currently here, and you will get to know the others in the tutorials. Thanks. That was just a short introduction so that you know their faces. Okay, good. So as I said, I'm currently
replacing Andreas Maier. I'm also a postdoc at the lab, and I will probably stand in for him next week, maybe also the week after. Whenever Andreas cannot come, typically either Katharina Breininger or I will replace him, so that you don't see too many changing faces. Okay, so you have probably all heard of deep learning, I guess. Otherwise
you wouldn't be here. It's always difficult to characterize deep learning, or to say where it sits among related fields: big data is another big buzzword, and then there are artificial intelligence, machine learning, and representation learning. Deep learning typically needs big data, and it is a part of artificial intelligence, but not the other way around: artificial intelligence is not a subset of deep learning. Deep learning is also a subset of machine learning and of representation learning. So don't get confused if someone, say at one of these big companies, talks about AI; often it's not AI in the classical sense but just normal pattern recognition, well, whatever "normal" means. That can also be great, but it can be a logistic regression or something like that. However, all these terms change over time; nowadays
everybody talks about artificial intelligence again but actually means deep learning. So
yeah, it's a little bit complicated, also for us as researchers: how do we frame our work now? Is it artificial intelligence? I think that's always a difficult question. At our lab, for example, we had the old artificial intelligence, which worked on symbolic representations; then we had pattern recognition and machine learning on the other side; and now it's called artificial intelligence again, which means something completely different. You can also characterize deep learning by its tasks, so classification
tasks, segmentation tasks, regression, generation, detection tasks, and so on, and by different learning settings, so supervised versus unsupervised learning. Neural networks are of course a basic part of deep learning: deep learning nearly always involves neural networks, because the "deep" refers to multiple layers, and these are typically realized by neural networks. Feature learning is also a part of deep learning. Okay, so in today's lecture we will have a small, or well, quite big motivation,
actually; all these slides are for today. The main part is the motivation and, in general, what we are doing at our Pattern Recognition Lab. Then we get to a little bit of theory, but it's really tiny today; the big theory part will come next week. And then, at the end, we have some organizational matters about the tutorials, the lecture, the exams, and so on. Okay, let's start with a small motivation. I don't know, is
any one of you holding stocks? It's a good time, actually. We have shown this graph several times now in this deep learning lecture; we have been giving this lecture since 2016/17, and the curve always rose until about 2019. Okay, does someone have an idea why the Nvidia stock drops here? Nvidia, the biggest GPU producer? Bitcoin, yes, thank you, very good, exactly. Here we have the drop of Bitcoin, so it's not all about deep learning. But of course deep learning is a part of it, and whenever I saw these graphs I was annoyed with myself for not having bought some Nvidia stocks. So yeah,
Presenters
Accessible via: Open Access
Duration: 01:08:08 min
Recording date: 2019-10-15
Uploaded on: 2019-10-15 17:09:04
Language: en-US
- (multilayer) perceptron, backpropagation, fully connected neural networks
- loss functions and optimization strategies
- convolutional neural networks (CNNs)
- activation functions
- regularization strategies
- common practices for training and evaluating neural networks
- visualization of networks and results
- common architectures, such as LeNet, AlexNet, VGG, GoogLeNet
- recurrent neural networks (RNN, TBPTT, LSTM, GRU)
- deep reinforcement learning
- unsupervised learning (autoencoder, RBM, DBM, VAE)
- generative adversarial networks (GANs)
- weakly supervised learning
- applications of deep learning (segmentation, object detection, speech recognition, ...)
The accompanying exercises will provide a deeper understanding of the workings and architecture of neural networks.
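As a small taste of the first topic in the list above, the (multilayer) perceptron, here is a minimal forward-pass sketch. This is an illustrative example only, not code from the lecture or exercises: the weights are random and untrained (a real network would learn them via backpropagation), and it assumes NumPy is available.

```python
import numpy as np

def relu(x):
    # Rectified linear unit, a common activation function
    return np.maximum(0.0, x)

def mlp_forward(x, W1, b1, W2, b2):
    # A two-layer perceptron: one hidden layer with a nonlinearity,
    # followed by a linear output layer producing class scores (logits).
    h = relu(W1 @ x + b1)
    return W2 @ h + b2

# Hypothetical shapes: 4 input features, 8 hidden units, 3 output classes.
rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W1 = rng.standard_normal((8, 4)) * 0.1
b1 = np.zeros(8)
W2 = rng.standard_normal((3, 8)) * 0.1
b2 = np.zeros(3)

logits = mlp_forward(x, W1, b1, W2, b2)
print(logits.shape)  # one score per class: (3,)
```

Stacking more such layers is exactly what makes the network "deep"; the later lecture topics (loss functions, optimization, regularization) cover how the weights are actually trained.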